Generalization ability of Boolean functions implemented in feedforward neural networks

Author

  • Leonardo Franco
Abstract

We introduce a measure for the complexity of Boolean functions that is highly correlated with the generalization ability obtained when the functions are implemented on feedforward neural networks. The measure, based on counting the number of neighbouring examples that differ in their output value, can be computed simply from the definition of the functions, independently of their implementation. Numerical simulations performed on different architectures show good agreement between the estimated complexity and the generalization ability and training times obtained. The proposed measure could serve as a useful tool for carrying out a systematic study of the computational capabilities of network architectures by classifying Boolean functions in an easy and reliable way. Also, given that the average generalization ability computed over the whole set of Boolean functions is 0.5, a very complex set of functions was found for which the generalization ability is lower than for random functions.
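The neighbour-based idea in the abstract can be sketched as follows: enumerate all input pairs at Hamming distance 1 and count those whose outputs differ. This is a minimal illustration, not the paper's exact definition; the function names and the normalization (dividing by the total number of neighbouring pairs) are assumptions.

```python
from itertools import product

def neighbour_complexity(f, n):
    """Fraction of Hamming-distance-1 input pairs whose outputs differ.

    f: a Boolean function taking a tuple of n bits and returning 0 or 1.
    n: number of input bits.
    Note: this sketch counts each unordered pair twice (once from each
    endpoint), which cancels out in the normalization.
    """
    differing = 0
    total = 0
    for x in product((0, 1), repeat=n):
        fx = f(x)
        for i in range(n):
            y = list(x)
            y[i] ^= 1  # flip one bit to get a neighbouring example
            total += 1
            if fx != f(tuple(y)):
                differing += 1
    return differing / total

# Parity flips its output on every single-bit change, so every
# neighbouring pair differs; a constant function has no such pairs.
parity = lambda bits: sum(bits) % 2
constant = lambda bits: 0
```

Under this normalization the measure ranges from 0 (constant functions) to 1 (parity), matching the intuition that parity is maximally hard to generalize from examples.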


Related articles

A measure for the complexity of Boolean functions related to their implementation in neural networks

We define a measure for the complexity of Boolean functions related to their implementation in neural networks, and in particular closely related to the generalization ability that can be obtained through the learning process. The measure is computed by counting the number of neighbouring examples that differ in their output value. Pairs of these examples have been previously shown to b...


Role of Function Complexity and Network Size in the Generalization Ability of Feedforward Networks

The generalization ability of architectures of different sizes, with one and two hidden layers, trained with backpropagation combined with early stopping, has been analyzed. The dependence of the generalization process on the complexity of the function being implemented is studied using a recently introduced measure for the complexity of Boolean functions. For a whole set of Boolean symmetric functi...


The influence of opposite examples and randomness on the generalization complexity of Boolean functions

We analyze Boolean functions using a recently proposed measure of their complexity. This complexity measure, motivated by the aim of relating the complexity of the functions with the generalization ability that can be obtained when the functions are implemented in feed-forward neural networks, is the sum of two components. The first of these is related to the ‘average sensitivity’ of the functi...


Generalization and Modularity in Feed-Forward Boolean Networks

We construct a family of architectures that implements the parity function in feedforward neural networks. The resulting networks have a modular architecture in which the degree of modularity can be controlled by the maximum fan-in allowed. Among other features, we analyze the generalization ability of these structures, obtaining analytical results for an arbitrary number of input bits. Both analyt...


Constructive Training Methods for Feedforward Neural Networks with Binary Weights

Quantization of the parameters of a perceptron is a central problem in the hardware implementation of neural networks using digital technology. A neural model with each weight limited to a small integer range requires only a small silicon area. Moreover, according to Occam's razor principle, better generalization can be expected from a simpler computational model. The price to pay...



Journal:
  • Neurocomputing

Volume 70, Issue -

Pages -

Publication date: 2006